NVIDIA’s creative AI allows players to chat with NPCs
NVIDIA has unveiled a technology called Avatar Cloud Engine (ACE) that lets players speak naturally to non-playable characters (NPCs) and receive appropriate responses. The company showed off the technology during its Computex 2023 keynote on generative AI, with a demo called Kairos in which the player talks to an NPC named Jin in a seedy-looking ramen shop.
The demo (below, in 32:9, the widest widescreen I’ve ever seen) shows the player talking to Jin. “Hi Jin, how are you?” the player asks. “Unfortunately, not very well,” Jin replies. “Why is that?” “I’m worried about the crime around here. It’s gotten really bad lately. My ramen shop has been caught in the crossfire.”
Yes, the dialogue is a bit wooden; it seems like ChatGPT could have done a better job. The point, though, is to show that you can speak into your headset and the NPC will respond in context, making the interaction feel more natural than is usual in these situations.
NVIDIA built the demo in collaboration with Convai to promote ACE, which can run in the cloud or locally (on NVIDIA hardware, natch). It uses NVIDIA NeMo to build, customize, and deploy large language models that can be given character backstories and personalities, with guardrails to protect against inappropriate conversations. It also uses NVIDIA Riva for speech recognition and text-to-speech conversion, along with NVIDIA’s Omniverse Audio2Face “to create the game character’s expressive facial animations to match any speech track.”
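To make the flow of that pipeline concrete, here is a minimal Python sketch of an ACE-style NPC turn: speech recognition, an LLM reply filtered by guardrails, and audio-driven facial animation. Every function here is a hypothetical stand-in, not a real NVIDIA API; Riva, NeMo, and Audio2Face each ship their own SDKs with very different interfaces.

```python
# Illustrative sketch of an ACE-style NPC dialogue loop.
# All functions below are hypothetical placeholders, NOT real NVIDIA APIs.

def speech_to_text(audio: bytes) -> str:
    """Stand-in for speech recognition (Riva's role in ACE)."""
    return "Hi Jin, how are you?"

def generate_reply(player_text: str, backstory: str) -> str:
    """Stand-in for a NeMo-customized LLM conditioned on a character backstory."""
    return "Unfortunately, not very well."

def apply_guardrails(reply: str, banned_topics: set[str]) -> str:
    """Stand-in for guardrails that block inappropriate output."""
    if any(topic in reply.lower() for topic in banned_topics):
        return "I'd rather not talk about that."
    return reply

def animate_face(speech_audio: bytes) -> dict:
    """Stand-in for Audio2Face: derives facial animation from speech audio."""
    return {"jaw_open": 0.4, "brow_raise": 0.1}

def npc_turn(player_audio: bytes, backstory: str) -> tuple[str, dict]:
    """One conversational turn: hear the player, reply in character, animate."""
    player_text = speech_to_text(player_audio)
    reply = apply_guardrails(generate_reply(player_text, backstory),
                             banned_topics={"violence"})
    # In the real pipeline the reply would be synthesized to audio
    # (text-to-speech) before driving the animation; placeholder bytes here.
    return reply, animate_face(b"")

reply, face = npc_turn(b"", backstory="Jin runs a ramen shop.")
print(reply)  # prints: Unfortunately, not very well.
```

The key design point the demo illustrates is that each stage is a separate, swappable service, which is why NVIDIA can offer ACE both in the cloud and on local hardware.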
The demo was built in Unreal Engine 5 to showcase NVIDIA’s ray tracing and other GPU features. The visuals are actually more convincing than the AI dialogue, though it’s easy to see how the latter could be vastly improved. NVIDIA hasn’t announced which games will use the full technology, but S.T.A.L.K.E.R. 2: Heart of Chornobyl and Fort Solis already use Omniverse Audio2Face.